Add new validation data to booster.
Get number of predictions.
Create a new boosting learner.
Load an existing booster from model file.
Dump model to JSON.
Get model feature importance.
Free space for booster.
Method corresponding to LGBM_BoosterPredictSparseOutput to free the allocated data.
Get index of the current boosting iteration.
Get evaluation for training data and validation data.
Get number of evaluation metrics.
Get names of evaluation metrics.
Get names of features.
Get leaf value.
Get an int indicating whether the booster is fitting linear trees.
Get parameters as JSON string.
Get model lower bound value.
Get number of classes.
Get number of features.
Get number of predictions for training data and validation data (this can be used to support customized evaluation functions).
Get prediction for training data and validation data.
Get model upper bound value.
Load an existing booster from string.
Merge model from other_handle into handle.
Get number of trees per iteration.
Get number of weak sub-models.
Make prediction for a new dataset in CSC format.
Make prediction for a new dataset in CSR format.
Make prediction for a new dataset in CSR format. This method re-uses the internal predictor structure from previous calls and is optimized for single row invocation.
Faster variant of LGBM_BoosterPredictForCSRSingleRow.
Initialize and return a FastConfigHandle for use with LGBM_BoosterPredictForCSRSingleRowFast.
Make prediction for file.
Make prediction for a new dataset.
Make prediction for a new dataset. This method re-uses the internal predictor structure from previous calls and is optimized for single row invocation.
Faster variant of LGBM_BoosterPredictForMatSingleRow.
Initialize and return a FastConfigHandle for use with LGBM_BoosterPredictForMatSingleRowFast.
Make prediction for a new dataset presented in a form of array of pointers to rows.
Make sparse prediction for a new dataset in CSR or CSC format. Currently only used for feature contributions.
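As a rough sketch of the dense-matrix prediction path, the snippet below calls LGBM_BoosterPredictForMat on a row-major double matrix. The booster handle, matrix contents, and buffer sizing are placeholders, and the presence of the start_iteration argument assumes a recent LightGBM release.

    #include <stdio.h>
    #include <stdlib.h>
    #include <stdint.h>
    #include <LightGBM/c_api.h>

    /* Predict on a small row-major double matrix with an already-trained
       booster. Output size for normal prediction is nrow * num_class. */
    static int predict_dense(BoosterHandle booster, const double *data,
                             int32_t nrow, int32_t ncol, int num_class) {
      int64_t out_len = 0;
      double *out = malloc(sizeof(double) * (size_t)nrow * (size_t)num_class);
      if (out == NULL) return -1;
      int err = LGBM_BoosterPredictForMat(
          booster, data, C_API_DTYPE_FLOAT64, nrow, ncol,
          /*is_row_major=*/1, C_API_PREDICT_NORMAL,
          /*start_iteration=*/0, /*num_iteration=*/-1,
          /*parameter=*/"", &out_len, out);
      if (err != 0) {
        fprintf(stderr, "predict failed: %s\n", LGBM_GetLastError());
      } else {
        for (int64_t i = 0; i < out_len; ++i)
          printf("pred[%lld] = %f\n", (long long)i, out[i]);
      }
      free(out);
      return err;
    }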
Refit the tree model using the new data (online learning).
Reset config for booster.
Reset training data for booster.
Rollback one iteration.
Save model into file.
Save model to string.
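A minimal sketch of saving a trained booster to a file and loading it back. It assumes a header recent enough that LGBM_BoosterSaveModel takes a feature_importance_type argument; the file path and booster handle are placeholders.

    #include <stdio.h>
    #include <LightGBM/c_api.h>

    /* Save a trained booster to disk, then reload it into a new handle. */
    static int save_and_reload(BoosterHandle booster, const char *path) {
      if (LGBM_BoosterSaveModel(booster, /*start_iteration=*/0,
                                /*num_iteration=*/-1,
                                C_API_FEATURE_IMPORTANCE_SPLIT, path) != 0) {
        fprintf(stderr, "save failed: %s\n", LGBM_GetLastError());
        return -1;
      }
      int num_iterations = 0;
      BoosterHandle loaded = NULL;
      if (LGBM_BoosterCreateFromModelfile(path, &num_iterations, &loaded) != 0) {
        fprintf(stderr, "load failed: %s\n", LGBM_GetLastError());
        return -1;
      }
      printf("reloaded model with %d iterations\n", num_iterations);
      LGBM_BoosterFree(loaded);
      return 0;
    }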
Set leaf value.
Shuffle models.
Update the model for one iteration.
Update the model by specifying gradient and Hessian directly (this can be used to support customized loss functions).
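A minimal training-loop sketch built on the booster functions above, assuming train and valid Datasets that are already constructed and labeled; the parameter string is only illustrative.

    #include <stdio.h>
    #include <LightGBM/c_api.h>

    /* Create a booster over an existing training Dataset, attach a
       validation Dataset, and run a fixed number of boosting rounds. */
    static int train_booster(DatasetHandle train, DatasetHandle valid,
                             int num_rounds, BoosterHandle *out) {
      BoosterHandle booster = NULL;
      if (LGBM_BoosterCreate(train, "objective=binary metric=auc", &booster) != 0) {
        fprintf(stderr, "create failed: %s\n", LGBM_GetLastError());
        return -1;
      }
      if (LGBM_BoosterAddValidData(booster, valid) != 0) {
        fprintf(stderr, "add valid failed: %s\n", LGBM_GetLastError());
        LGBM_BoosterFree(booster);
        return -1;
      }
      for (int i = 0; i < num_rounds; ++i) {
        int is_finished = 0;
        if (LGBM_BoosterUpdateOneIter(booster, &is_finished) != 0) {
          fprintf(stderr, "iteration %d failed: %s\n", i, LGBM_GetLastError());
          LGBM_BoosterFree(booster);
          return -1;
        }
        if (is_finished) break;  /* training cannot improve further */
      }
      *out = booster;
      return 0;
    }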
Check that the feature names of the data match the ones used to train the booster.
Free space for byte buffer.
Get a ByteBuffer value at an index.
Add features from source to target.
Allocate the space for dataset and bucket feature bins according to reference dataset.
Create a dataset from CSC format.
Create a dataset from CSR format.
Create a dataset from CSR format through callbacks.
Load dataset from file (like LightGBM CLI version does).
Create dataset from dense matrix.
Create dataset from array of dense matrices.
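A small sketch of building a Dataset from a dense row-major matrix and attaching labels via LGBM_DatasetSetField; the feature values, labels, and parameter string are placeholders.

    #include <stdio.h>
    #include <LightGBM/c_api.h>

    /* Build a Dataset from a 4x2 row-major float matrix and set its labels. */
    static int make_dataset(DatasetHandle *out) {
      float data[4][2] = {{1.0f, 2.0f}, {3.0f, 4.0f}, {5.0f, 6.0f}, {7.0f, 8.0f}};
      float labels[4] = {0.0f, 1.0f, 0.0f, 1.0f};
      DatasetHandle ds = NULL;
      if (LGBM_DatasetCreateFromMat(data, C_API_DTYPE_FLOAT32, /*nrow=*/4,
                                    /*ncol=*/2, /*is_row_major=*/1,
                                    "max_bin=255", /*reference=*/NULL, &ds) != 0) {
        fprintf(stderr, "dataset creation failed: %s\n", LGBM_GetLastError());
        return -1;
      }
      /* Labels (and similarly weights or groups) are set through
         LGBM_DatasetSetField. */
      if (LGBM_DatasetSetField(ds, "label", labels, 4, C_API_DTYPE_FLOAT32) != 0) {
        fprintf(stderr, "set label failed: %s\n", LGBM_GetLastError());
        LGBM_DatasetFree(ds);
        return -1;
      }
      *out = ds;
      return 0;
    }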
Allocate the space for dataset and bucket feature bins according to sampled data.
Allocate the space for dataset and bucket feature bins according to serialized reference dataset.
Save dataset to text file, intended for debugging use only.
Free space for dataset.
Get feature names of dataset.
Get number of bins for feature.
Get info vector from dataset.
Get number of data points.
Get number of features.
Create subset of a data.
Initialize the Dataset for streaming.
Mark the Dataset as complete by calling dataset->FinishLoad.
Push data to existing dataset; if nrow + start_row == num_total_row, dataset->FinishLoad will be called.
Push data to existing dataset; if nrow + start_row == num_total_row, dataset->FinishLoad will be called.
Push CSR data to existing dataset. (See LGBM_DatasetPushRowsWithMetadata for more details.)
Push data to existing dataset. The general flow for a streaming scenario is to create the Dataset schema, initialize it for streaming with LGBM_DatasetInitStreaming, push data with the *WithMetadata functions, and then call LGBM_DatasetMarkFinished.
Save dataset to binary file.
Create a dataset schema representation as a binary byte array (excluding data).
Save feature names to dataset.
Set an info vector (such as label, weight, init_score, or group) of the dataset.
Set whether or not the Dataset waits for a manual MarkFinished call or calls FinishLoad on itself automatically. Set to 1 for a streaming scenario, and use LGBM_DatasetMarkFinished to finish the Dataset manually.
Raise errors for attempts to update dataset parameters.
Dump all parameter names with their aliases to JSON.
Release FastConfig object.
Get string message of the last error.
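The C API functions signal failure through a nonzero return value, so a common usage pattern is a small checking macro built around LGBM_GetLastError; this is only a sketch, not part of the library.

    #include <stdio.h>
    #include <stdlib.h>
    #include <LightGBM/c_api.h>

    /* Abort with the library's last error message if a c_api call fails. */
    #define CHECK_LGBM(call)                                              \
      do {                                                                \
        if ((call) != 0) {                                                \
          fprintf(stderr, "%s failed: %s\n", #call, LGBM_GetLastError()); \
          exit(EXIT_FAILURE);                                             \
        }                                                                 \
      } while (0)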
Get number of samples based on parameters and total number of rows of data.
Finalize the network.
Initialize the network.
Initialize the network with external collective functions.
Register a callback function for log redirecting.
Create sample indices for total number of rows.
Set string message of the last error.